
    Nonparametric estimation of mean and dispersion functions in extended generalized linear models.

    In this paper the interest is in regression analysis for data that may show overdispersion or underdispersion. The starting point for modeling is the class of generalized linear models, in which we no longer require a linear form for the mean regression function but allow it to be any smooth function of the covariate(s). In view of analyzing overdispersed or underdispersed data, we additionally introduce an unknown dispersion function. The mean regression function and the dispersion function are then estimated using P-splines with a difference-type penalty to prevent overfitting. We discuss two approaches: one based on an extended quasi-likelihood idea and one based on a pseudo-likelihood approach. The choice of smoothing parameters and implementation issues are discussed. The performance of the estimation method is investigated via simulations, and its use is illustrated on several data sets, including continuous data, counts and proportions.
    Keywords: Double exponential family; Extended quasi-likelihood; Modeling; Overdispersion; Pseudo-likelihood; P-splines; Regression; Variance estimation; Underdispersion
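
    As a rough illustration of the P-spline machinery, the sketch below (in Python, with hypothetical names and defaults) fits only the mean function by penalized least squares, using a cubic B-spline basis and a second-order difference penalty with a fixed smoothing parameter; the extended quasi-likelihood and pseudo-likelihood steps for the dispersion function are not reproduced here.

```python
import numpy as np
from scipy.interpolate import BSpline

def bspline_basis(x, n_knots=20, degree=3):
    """Design matrix of a clamped cubic B-spline basis on equally spaced knots."""
    xl, xr = x.min(), x.max()
    inner = np.linspace(xl, xr, n_knots)
    t = np.r_[[xl] * degree, inner, [xr] * degree]  # clamped knot vector
    n_basis = len(t) - degree - 1
    B = np.empty((len(x), n_basis))
    for j in range(n_basis):
        c = np.zeros(n_basis)
        c[j] = 1.0
        B[:, j] = BSpline(t, c, degree)(x)          # j-th basis function
    return B

def pspline_fit(x, y, lam=1.0, order=2):
    """Minimize ||y - B a||^2 + lam * ||D a||^2, D = order-th difference matrix."""
    B = bspline_basis(x)
    D = np.diff(np.eye(B.shape[1]), n=order, axis=0)  # difference penalty
    a = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
    return B @ a

# toy usage: smooth a noisy sine curve
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + np.random.default_rng(0).normal(0.0, 0.2, 200)
fitted = pspline_fit(x, y, lam=1.0)
```

    In practice the smoothing parameter lam would be chosen by a data-driven criterion, as discussed in the paper.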

    Robust estimation of mean and dispersion functions in extended generalized additive models.

    Generalized Linear Models are a widely used method for obtaining parametric estimates of the mean function. They have been further extended to allow the relationship between the mean function and the covariates to be more flexible via Generalized Additive Models. However, the fixed variance structure can in many cases be too restrictive. The Extended Quasi-Likelihood (EQL) framework allows estimation of both the mean and the dispersion/variance as functions of covariates. As with other maximum likelihood methods, however, EQL estimates are not resistant to outliers: methods are needed to obtain robust estimates of both the mean and the dispersion function. In this paper we obtain functional estimates of the mean and the dispersion that are both robust and smooth. The performance of the proposed method is illustrated via a simulation study and some real data examples.
    Keywords: Dispersion; Generalized additive modelling; Mean regression function; M-estimation; P-splines; Robust estimation
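
    One generic way to robustify such a fit, sketched below under the same penalized-spline setup, is iteratively reweighted penalized least squares with Huber weights and a MAD scale estimate. This is a standard M-estimation scheme, not necessarily the paper's exact estimator; `B` stands for any spline design matrix such as the one built in the previous sketch.

```python
import numpy as np

def huber_weights(r, c=1.345):
    """Huber psi(r)/r weights: 1 for small residuals, c/|r| beyond c."""
    a = np.abs(r)
    return np.where(a <= c, 1.0, c / np.maximum(a, 1e-12))

def robust_pspline(B, y, lam=1.0, order=2, n_iter=50, tol=1e-8):
    """Iteratively reweighted penalized least squares with Huber weights."""
    D = np.diff(np.eye(B.shape[1]), n=order, axis=0)
    P = lam * (D.T @ D)
    a = np.linalg.solve(B.T @ B + P, B.T @ y)             # non-robust start
    for _ in range(n_iter):
        r = y - B @ a
        s = 1.4826 * np.median(np.abs(r - np.median(r)))  # MAD scale estimate
        w = huber_weights(r / max(s, 1e-12))
        a_new = np.linalg.solve(B.T @ (w[:, None] * B) + P, B.T @ (w * y))
        if np.max(np.abs(a_new - a)) < tol:
            a = a_new
            break
        a = a_new
    return B @ a
```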

    Parametrisation of change-permitting extreme value models and its impact on the description of change

    The potential for changes in environmental extremes is routinely investigated by fitting change-permitting extreme value models to long-term observations, allowing one or more distribution parameters to change as a function of time or some other covariate. In most extreme value analyses, the main quantity of interest is typically the upper quantiles of the distribution, which are often needed for practical applications such as engineering design. This study focuses on the changes in quantile estimates under different change-permitting models. First, metrics that measure the impact of changes in parameters on changes in quantiles are introduced. The mathematical structure of these change metrics is investigated for several change-permitting models based on the Generalised Extreme Value (GEV) distribution. It is shown that for the most commonly used models, the predicted changes in the quantiles are a non-intuitive function of the distribution parameters, leading to results that are difficult to interpret. Next, it is posited that commonly used change-permitting GEV models do not preserve a constant coefficient of variation, a property that is typically assumed to hold for environmental extremes. To address these shortcomings, a new (parsimonious) model is proposed: it assumes a constant coefficient of variation, allowing the location and scale parameters to change simultaneously. The proposed model results in changes in the quantile function that are easier to interpret. Finally, the consequences of the different modelling choices for quantile estimates are exemplified using a dataset of extreme peak river flow measurements in Massachusetts, USA. It is argued that the decision on which model structure to adopt to describe change in extremes should also take into consideration any requirements on the behaviour of the quantiles of interest.
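
    To make the constant coefficient of variation idea concrete, the sketch below (an illustrative parametrisation, not necessarily the paper's exact one) lets the GEV location follow an exponential trend with the scale kept proportional to the location; the resulting relative change in any quantile reduces to the relative change in the location and no longer depends on the probability level.

```python
import numpy as np

def gev_quantile(p, mu, sigma, xi):
    """GEV quantile (return level) at non-exceedance probability p."""
    y = -np.log(p)
    if abs(xi) < 1e-9:                        # Gumbel limit
        return mu - sigma * np.log(y)
    return mu + sigma / xi * (y ** (-xi) - 1.0)

def quantile_constant_cv(p, t, mu0, beta, phi, xi):
    """Change-permitting GEV with constant coefficient of variation:
    mu(t) = mu0 * exp(beta * t) and sigma(t) = phi * mu(t), so the whole
    distribution rescales with mu(t)."""
    mu_t = mu0 * np.exp(beta * t)
    return gev_quantile(p, mu_t, phi * mu_t, xi)

# the 100-year level (p = 0.99) after 50 years, relative to year 0,
# equals exp(beta * 50) regardless of p -- an easy-to-interpret change
ratio = (quantile_constant_cv(0.99, 50, 100.0, 0.002, 0.3, 0.1)
         / quantile_constant_cv(0.99, 0, 100.0, 0.002, 0.3, 0.1))
```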

    Attribution of long-term changes in peak river flows in Great Britain

    We investigate the evidence for changes in the magnitude of peak river flows in Great Britain. We focus on a set of 117 near-natural "benchmark" catchments to detect trends not driven by land use and other human impacts, and aim to attribute trends in peak river flows to climate indices such as the North Atlantic Oscillation (NAO) and the East Atlantic (EA) index. We propose modelling all stations together in a Bayesian multilevel framework so as to better detect any signal present in the data by pooling information across stations. This approach leads to the detection of a clear countrywide time trend. Additionally, in a univariate approach, both the EA and NAO indices appear to have a considerable association with peak river flows. When a multivariate approach is taken to unmask the collinearity between the climate indices and time, the association between the NAO and peak flows disappears, while the association with the EA remains clear. This demonstrates the usefulness of a multivariate and multilevel approach for accurately attributing trends in peak river flows.
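
    A minimal sketch of the pooling idea, assuming PyMC and synthetic stand-in data: a Gaussian model on (log) peak flows with partially pooled station intercepts and shared countrywide time and EA effects. The paper's actual likelihood and covariate structure will differ; everything named here is illustrative.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_stations, n_years = 20, 40                        # synthetic stand-in data
station = np.repeat(np.arange(n_stations), n_years)
year = np.tile(np.linspace(-1.0, 1.0, n_years), n_stations)
ea = rng.normal(size=station.size)                  # placeholder EA index
y = 0.1 * year + 0.2 * ea + rng.normal(0.0, 0.5, station.size)

with pm.Model() as multilevel:
    beta_time = pm.Normal("beta_time", 0.0, 1.0)    # shared countrywide trend
    beta_ea = pm.Normal("beta_ea", 0.0, 1.0)        # shared EA effect
    mu_a = pm.Normal("mu_a", 0.0, 1.0)              # mean station intercept
    sigma_a = pm.HalfNormal("sigma_a", 1.0)         # between-station spread
    a = pm.Normal("a", mu_a, sigma_a, shape=n_stations)  # partial pooling
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("obs", a[station] + beta_time * year + beta_ea * ea,
              sigma, observed=y)
    idata = pm.sample(1000, tune=1000, chains=2)
```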

    Going Beyond the Ensemble Mean: Assessment of Future Floods From Global Multi‐Models

    Future changes in the occurrence of flood events can be estimated using multi-model ensembles to inform adaptation and mitigation strategies. In the near future, these estimates could be used to guide the updating of exceedance probabilities for flood control design and water resources management. However, estimating return levels from ensemble experiments is challenging: model runs are affected by biases and uncertainties and by inconsistencies in simulated peak flows when compared with observed data. Moreover, extreme value distributions are generally fitted to ensemble members individually and then averaged to obtain the ensemble fit, with a loss of information. To overcome these limitations, we propose a Bayesian hierarchical model for assessing changes in future peak flows and the uncertainty coming from global climate models, global impact models and their interaction. The proposed model allows all members of the ensemble to be used at once when estimating changes in the parameters of an extreme value distribution from historical to future peak flows. The approach is applied to a set of grid cells in the eastern United States, using both the full ensemble and a constrained version of it. We find that, while the dominant source of uncertainty in the changes varies across the domain, there is a consensus on a decrease in flood magnitudes toward the south. We conclude that projecting future flood magnitude under climate change remains elusive due to large uncertainty, mostly coming from the global models and from the intrinsically uncertain nature of extreme values.
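
    The contrast with fit-then-average can be sketched as follows: instead of fitting an extreme value distribution to each ensemble member and averaging, all historical and future peaks enter one likelihood with a common change parameter. The toy maximum likelihood version below (with simulated data; scipy's shape convention is c = -xi) stands in for the paper's Bayesian hierarchical model, which additionally separates climate-model and impact-model uncertainty.

```python
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

def neg_loglik(theta, hist, fut):
    """Joint GEV likelihood for historical and future peaks pooled over all
    ensemble members; delta is a common log-change in location and scale."""
    mu, log_sigma, c, delta = theta
    sigma = np.exp(log_sigma)
    ll = genextreme.logpdf(hist, c, loc=mu, scale=sigma).sum()
    ll += genextreme.logpdf(fut, c, loc=mu * np.exp(delta),
                            scale=sigma * np.exp(delta)).sum()
    return -ll

# simulated peaks pooled over members, with a true decrease in magnitude
rng = np.random.default_rng(1)
hist = genextreme.rvs(-0.1, loc=100, scale=30, size=600, random_state=rng)
fut = genextreme.rvs(-0.1, loc=90, scale=27, size=600, random_state=rng)
res = minimize(neg_loglik, x0=[100.0, np.log(30.0), -0.1, 0.0],
               args=(hist, fut), method="Nelder-Mead")
mu, log_sigma, c, delta = res.x     # delta < 0: flood magnitudes decrease
```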

    FEH Local: improving flood estimates using historical data

    The traditional approach to design flood estimation (for example, to derive the 100-year flood) is to apply a statistical model to time series of peak river flow measured by gauging stations. Such records are typically not very long: in the UK, for example, only about 10% of stations have records of more than 50 years. A long-explored way to augment the data available from a gauging station is to derive information about historical flood events and paleo-floods, which can be obtained from careful exploration of archives, old newspapers, flood marks or other signs of past flooding that are still discernible in the catchment, and the history of settlements. The inclusion of historical data in flood frequency estimation has been shown to substantially reduce the uncertainty around the estimated design events and is likely to provide insight into the rarest events, which might have pre-dated the relatively short systematic records. Among other things, the FEH Local project funded by the Environment Agency aims to develop methods to easily incorporate historical information into the standard method of statistical flood frequency estimation in the UK. Different statistical estimation procedures are explored, namely maximum likelihood and partial probability weighted moments, and the strengths and weaknesses of each method are investigated. The project assesses the usefulness of historical data and aims to provide practitioners with guidelines indicating the circumstances in which the inclusion of historical data is likely to be beneficial in terms of reducing both the bias and the variability of the estimated flood frequency curves. The guidelines are based on the results of a large Monte Carlo simulation study, in which different estimation procedures and different data availability scenarios are examined. The study provides some indication of the situations under which different estimation procedures might perform better.
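
    The standard maximum likelihood route to including historical information treats the historical period as censored: k floods above a perception threshold are observed over h years, and the remaining h - k annual maxima are only known to lie below the threshold. A sketch, with simulated data and illustrative numbers:

```python
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

def neg_loglik_hist(theta, sys_peaks, hist_peaks, h, thresh):
    """GEV likelihood combining a systematic record with k historical floods
    above a perception threshold during an h-year historical period; the
    remaining h - k years are censored below the threshold."""
    mu, log_sigma, c = theta
    sigma = np.exp(log_sigma)
    ll = genextreme.logpdf(sys_peaks, c, loc=mu, scale=sigma).sum()
    ll += genextreme.logpdf(hist_peaks, c, loc=mu, scale=sigma).sum()
    ll += (h - len(hist_peaks)) * genextreme.logcdf(thresh, c, loc=mu,
                                                    scale=sigma)
    return -ll

# 40 years of gauged peaks plus 3 archive floods above 250 m^3/s
# known from a 150-year historical period
rng = np.random.default_rng(2)
sys_peaks = genextreme.rvs(-0.1, loc=100, scale=40, size=40, random_state=rng)
hist_peaks = np.array([260.0, 300.0, 410.0])
res = minimize(neg_loglik_hist, x0=[100.0, np.log(40.0), -0.1],
               args=(sys_peaks, hist_peaks, 150, 250.0),
               method="Nelder-Mead")
```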

    Space-time extreme rainfall simulation under a geostatistical approach

    In this work we illustrate an approach to simulating extreme events at high spatial resolution. First, we model spatio-temporal variability in the marginal distributions with a flexible semi-parametric specification. A Gaussian copula is then used to model the extremal dependence locally in time and space. The methods are showcased with an application to daily precipitation in the Venice lagoon catchment.
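
    A minimal sketch of the copula step, assuming an exponential correlogram and a simple wet-day/generalized-Pareto marginal in place of the paper's semi-parametric specification: correlated Gaussians are mapped to uniforms and then through the site-wise marginal quantile function.

```python
import numpy as np
from scipy.stats import norm, genpareto
from scipy.spatial.distance import cdist

rng = np.random.default_rng(3)
sites = rng.uniform(0.0, 50.0, size=(100, 2))        # site coordinates (km)
corr = np.exp(-cdist(sites, sites) / 20.0)           # exponential correlogram
L = np.linalg.cholesky(corr + 1e-10 * np.eye(len(sites)))

n_days = 365
z = L @ rng.standard_normal((len(sites), n_days))    # correlated Gaussians
u = norm.cdf(z)                                      # Gaussian copula uniforms

p_dry = 0.6                                          # illustrative marginal:
wet = u > p_dry                                      # dry with prob. 0.6, GPD
rain = np.zeros_like(u)                              # intensities when wet
rain[wet] = genpareto.ppf((u[wet] - p_dry) / (1.0 - p_dry), c=0.1, scale=5.0)
```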